Training Random Asymmetric 'Neural' Networks towards Chaos - a Progress Report

Authors

  • P. C. McGuire
  • G. C. Littlewort
  • C. Pershing
Abstract

We explore a non-Hebbian plasticity algorithm for a random asymmetric 'neural' network in synchronous, discrete time, which causes the period of the network's inherent limit cycles to diverge quickly with the plasticity parameter. The limit-cycle period shows a strong peak as we raise the neural units' thresholds above their normal values. It is much easier to increase the limit-cycle period via the plasticity algorithm when the memory of the accumulating signal of the fields at the non-firing units is non-zero.
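As an illustrative sketch of the kind of dynamics the abstract describes (not the authors' plasticity algorithm), the following simulates a random asymmetric threshold network under synchronous, discrete-time updates and measures the period of the limit cycle it falls into. The network size, weight distribution, and thresholds are assumptions chosen only for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 12
W = rng.standard_normal((N, N))   # asymmetric: W[i, j] != W[j, i] in general
theta = np.zeros(N)               # "normal" (zero) thresholds
state = rng.integers(0, 2, N)     # binary unit states

def step(s):
    """Synchronous update: a unit fires iff its input field exceeds its threshold."""
    return (W @ s > theta).astype(np.int64)

def limit_cycle_period(s, max_steps=5000):
    """Iterate until a previously visited state recurs; return the cycle length.

    With N = 12 binary units the state space has 2**12 = 4096 states, so by
    the pigeonhole principle a repeat is guaranteed within max_steps."""
    seen = {}
    for t in range(max_steps):
        key = s.tobytes()
        if key in seen:
            return t - seen[key]
        seen[key] = t
        s = step(s)

period = limit_cycle_period(state.astype(np.int64))
print(period)
```

Because the dynamics are deterministic on a finite state space, every trajectory consists of a transient followed by a limit cycle, which is what the dictionary-of-visited-states trick detects.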


Related Articles

On the correlation dimension of recurrent neural networks

Recurrent sigmoidal neural networks with asymmetric weight matrices and recurrent neural networks with nonmonotone transfer functions can exhibit ongoing fluctuations rather than settling into point attractors. It is, however, an open question whether these fluctuations are the sign of low-dimensional chaos or whether they can be considered close to stochastic. We report on the calculation of the correlat...


Correlation Between Eigenvalue Spectra and Dynamics of Neural Networks

This letter presents a study of the correlation between the eigenvalue spectra of synaptic matrices and the dynamical properties of asymmetric neural networks with associative memories. For this type of neural network, it was found that there are essentially two different dynamical phases: the chaos phase, with almost all trajectories converging to a single chaotic attractor, and the memory pha...


Discrete Chaos *

We propose a theory of deterministic chaos for discrete systems, based on their representations in symbolic history spaces Ω = (B^∞, T_Δ). These are spaces of semi-infinite sequences, like the one-sided shift spaces, but endowed with a more general topology T_Δ which we call a semicausal topology. We show that Ω is a metrizable Cantor set which embeds the chaotic attractor Λ. We discuss metrical...


Chaos in random neural networks.

A continuous-time dynamic model of a network of N nonlinear elements interacting via random asymmetric couplings is studied. A self-consistent mean-field theory, exact in the N → ∞ limit, predicts a transition from a stationary phase to a chaotic phase occurring at a critical value of the gain parameter. The autocorrelations of the chaotic flow as well as the maximal Lyapunov exponent are calcula...
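A minimal numerical sketch of the transition this abstract describes can contrast a subcritical and a supercritical gain; the network size, time step, and the specific gain values below are illustrative assumptions, with couplings drawn as J_ij ~ N(0, g²/N) and Euler integration of dx_i/dt = -x_i + Σ_j J_ij tanh(x_j):

```python
import numpy as np

def simulate(g, seed=1, N=200, dt=0.05, steps=3000):
    """Euler-integrate the random network and return the final spread of activity."""
    rng = np.random.default_rng(seed)
    J = rng.standard_normal((N, N)) * g / np.sqrt(N)  # random asymmetric couplings
    x = rng.standard_normal(N)
    for _ in range(steps):
        x = x + dt * (-x + J @ np.tanh(x))
    return float(np.std(x))

# Below the critical gain the activity decays to the trivial fixed point;
# above it, fluctuations persist indefinitely.
quiet, chaotic = simulate(0.5), simulate(1.5)
print(quiet, chaotic)
```

For g below the critical value the origin is stable and the activity's standard deviation shrinks toward zero, while for g above it the network sustains large irregular fluctuations, consistent with the stationary-to-chaotic transition predicted by the mean-field theory.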


Is chaos good for learning?

This paper demonstrates that an artificial neural network training on time-series data from the logistic map at the onset of chaos trains more effectively when it is weakly chaotic. This suggests that a modest amount of chaos in the brain in addition to the ever present random noise might be beneficial for learning. In such a case, human subjects might exhibit an increased Lyapunov exponent in ...
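The logistic map x_{t+1} = r·x_t·(1 − x_t) referenced above has the closed-form Lyapunov exponent λ = ⟨ln |r(1 − 2x_t)|⟩, which crosses zero at the onset of chaos (r ≈ 3.5699). A small sketch, with the two r values chosen only to illustrate the periodic and chaotic regimes:

```python
import numpy as np

def lyapunov(r, x0=0.4, n=20000, burn=1000):
    """Estimate the logistic map's Lyapunov exponent by averaging
    ln |f'(x_t)| = ln |r (1 - 2 x_t)| along the trajectory."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        acc += np.log(abs(r * (1 - 2 * x)))
    return acc / n

# Negative exponent in the periodic regime, positive in the chaotic one.
lam_periodic, lam_chaotic = lyapunov(3.5), lyapunov(3.9)
print(lam_periodic, lam_chaotic)
```

Time series generated just above the crossing point are "weakly chaotic" in the sense of a small positive λ, which is the regime the paper argues is most favorable for training.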



Journal:

Volume   Issue

Pages  -

Publication date: 1992